Search Results: "gray"

1 December 2010

Gintautas Miliauskas: Windows (II): User Interface

(This is a part of a series of posts on my recent experience with Windows. See Windows (I) for the first post.)

The Windows user interface is definitely acceptable. The charge that it is too colorful or toy-like is completely unfounded. Rather, perhaps geeks should spend less time looking at their gray Motif boxy controls. In terms of speed, the UI is generally more responsive than on Linux, though maybe less so than previous versions of Windows (that could be because my Intel video IGP is a slouch).

An interesting observation is that apps are almost as heterogeneous in terms of interface as in GNU/Linux, even though the standard controls are ubiquitous. Even apps by Microsoft can be separated into different "generations" of UI (e.g., folder windows, Control Panel, Microsoft Office). It is also clear that app developers generally pay more attention to UI and usability, although perhaps not as much as in Apple products.

The greatest asset and the worst offender at the same time, by far, is the overall GUI orientation of the system. Needless to say, it aids discoverability, but it reduces scripting capabilities. The problem is that even if you do not need to write scripts per se, command-line actions are useful because they can be repeated and chained very easily using shell history. In Windows, I occasionally find myself doing much repetitive clicking that would likely be an "Alt-Tab Up Enter" (or sometimes just one key) sequence in Linux. Moreover, the lack of good standardized scripting is a huge pain during app deployments, which tend to be repetitive. I did not like Windows on servers before, and I do not like it now.

To be fair, my complaints about scripting capabilities may be partly moot, because I have only used vanilla Windows batch files and have not looked at all into Windows PowerShell, which is a new-generation scripting tool. The examples are fairly impressive. The language has a nice look and is clearly powerful, but it also looks somewhat complicated, which keeps me away until I find a good reason to learn it.

Initially I missed workspaces. The one app I found that was supposed to emulate workspaces would take several seconds to switch, so it was completely unusable. I learned to do without them rather quickly, though. Actually, I'm starting to think that using many workspaces is a sign that you are doing too many things at once, and they are also an invitation to distraction (how many of you have a "blog reader" workspace right beside your "work" workspace, ready at your fingertips at a moment's notice?). Workspaces do have one killer feature: you can jump directly to a given workspace instead of cycling through windows. This is very useful when working with more than two apps; without workspaces you are forced to think about the morphing Alt-Tab queue when switching windows.

Speaking about UI generations, I much prefer the ribbon toolbars of the new Office. The still-prevalent toolbars with zillions of old-style 32x32 (or 16x16) toolbar icons are really ugly. Typically only a few of the toolbar buttons are actually useful, and they make the user interface unnecessarily cramped and busy. The move to fewer and larger toolbar buttons is definitely on the right track.

The Explorer context menus also tend to grow crazy long. Every app wants to get in there, and in the end you have a context menu that takes up half a screen vertically. Needless to say, most of those items are rarely used. Sure, you can opt out during app installation, but at that time it is difficult to say how useful the context menu item will be, and there's no easy way (i.e., easier than just ignoring the cruft) to remove the entries afterwards.

In my coming posts I will cover package management and application development on Windows.

23 November 2010

Julien Danjou: Color contrast correction

I finally took some time to finish my color contrast corrector. It is now able to compare two colors and tell whether they are readable when used as foreground and background colors for text rendering. If they are too close, the code corrects both colors so that they become distant enough to be readable. To do that, it uses color coordinates in the CIE L*a*b* colorspace. This makes it easy to determine the luminance difference between two colors by comparing the L component of the coordinates. The default threshold used to determine readability based on luminance difference is 40 (out of 100), which seems to give pretty good results so far. It then uses the CIE Delta E 2000 formula to obtain the distance between the colors. A distance of 6 is considered enough for the colors to be distinctive in our case, but that can be adjusted; it depends on the reader's eyes. If both the color and luminance distances are big enough, the color pair is considered readable when used upon each other. If these criteria are not satisfied, the code simply tries to correct the colors by adjusting their L (luminance) components so that their difference is 40. Optionally, the background color can be fixed so that only the foreground color is adjusted; this is especially handy when the background color is not provided by any external style but is the screen's (like the Emacs frame background in my case). Here is an example result generated over 10 pairs of random colors. The left colors are randomly generated, and the right colors are the corrected ones.

bg: DarkSeaGreen4 fg: gray67 ->                             fg: #4a6b4b bg: #cccccc
bg: SlateGray4 fg: forest green ->                          fg: #9faec0 bg: #005700
bg: grey13 fg: grey36 ->                                    fg: #131313 bg: #6c6c6c
bg: MediumPurple2 fg: honeydew ->                           fg: #9e78ed bg: #f0fff0
bg: grey43 fg: chartreuse3 ->                               fg: #5e5e5e bg: #79de25
bg: linen fg: DeepPink2 ->                                  fg: linen bg: DeepPink2
bg: CadetBlue4 fg: blue1 ->                                 fg: #6c9fa4 bg: #0000e1
bg: gray33 fg: NavajoWhite3 ->                              fg: #525252 bg: #cfb58c
bg: chartreuse1 fg: RosyBrown3 ->                           fg: #9cff38 bg: #b28282
bg: medium violet red fg: DeepPink1 ->                      fg: #9c0060 bg: #ff55b9
All this has been written in Emacs Lisp. The code is now available in Gnus (and therefore in Emacs 24) in the packages color-lab and shr-color. Future work would be to add support for colour blindness. As a side note, several people pointed me at the WCAG formulas for determining luminance and contrast ratio. These are probably good criteria for choosing your colors when designing a user interface. However, they are not enough to determine whether a displayed color will be readable. This means you can use them if you are a designer, but IMHO they are pretty weak for detecting and correcting colors you did not choose.
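The code itself is Emacs Lisp, but the luminance-adjustment step described above can be sketched in Python. This is a sketch under assumptions: colors are given directly as L*a*b* triples (skipping RGB conversion), and the Delta E 2000 distance check is treated as a separate, pre-existing step.

```python
# Sketch of the luminance correction described above: ensure the L
# components of two L*a*b* colors differ by at least 40 (out of 100).
# This is an illustration, not the actual color-lab/shr-color code.

L_THRESHOLD = 40.0

def correct_pair(fg, bg, fix_background=False):
    """fg, bg: (L, a, b) tuples with L in 0..100.

    If the luminance difference is below the threshold, push the
    colors apart. With fix_background=True only the foreground moves
    (useful when the background is the screen's, e.g. the Emacs
    frame background)."""
    diff = fg[0] - bg[0]
    if abs(diff) >= L_THRESHOLD:
        return fg, bg  # already readable by the luminance criterion
    needed = L_THRESHOLD - abs(diff)
    sign = 1.0 if diff >= 0 else -1.0
    clamp = lambda l: max(0.0, min(100.0, l))
    if fix_background:
        return (clamp(fg[0] + sign * needed),) + fg[1:], bg
    # Otherwise split the adjustment between foreground and background.
    return ((clamp(fg[0] + sign * needed / 2),) + fg[1:],
            (clamp(bg[0] - sign * needed / 2),) + bg[1:])
```

Near the extremes of the L range the clamping can leave a pair short of the threshold; the real code also applies the Delta E 2000 color-distance criterion, which this sketch leaves out.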

10 November 2010

Luca Bruno: Online sprints, or how to revive a L10N team

The Italian L10N team has not been very active nor growing in recent years. In particular, we pretty much failed at attracting new members to our team, with the result that untranslated files are piling up and manpower is scarce. Following a suggestion of our uber-active Francesca, we decided to try a new move to invert the trend: organizing brief weekly online sprints open to everybody, where graybeard translators help newcomers get to grips with the Debian L10N infrastructure while collaboratively working on yet-untranslated targets. Last week, we tried our first and very introductory sprint, with a preliminary meeting on IRC to give instructions and set up ad-hoc pads. As a result, we ended up with the linux-2.6 po-debconf and a web page completely translated and proofread by almost fifteen people in just a couple of hours. The key point, however, is that the majority of participants were fresh L10N newbies, who we hope will join us permanently very soon after this first contact. Encouraged by the initial positive result, we have already announced our next sprint for Thursday the 11th, which will focus on package description translation (preceded by a crash course on the DDTSS, the related web interface). We hope that even more users will join us this time, and encourage other stalled translation teams to experiment with a similar approach to revive activity and encourage participation.

12 September 2010

Dirk Eddelbuettel: Chicago Half Marathon 2010

Second Sunday in September -- time for the annual Chicago Half Marathon, now in its fourteenth edition (and I have been running it in 2003, 2004, 2005, 2006, 2007, 2008 and 2009, making this and the JPM Chase Corporate Challenge the races I've run most often). The course was altered this year, along with an earlier start, as the Bears have their season home opener today. So we started north towards 57th, then ran down to 67th, turning east towards the lake at around three miles --- and the remaining ten miles ran along the lakefront up to 31st (the usual turn) and back down to 63rd. I like this course better; let's hope it sticks. Race conditions were fantastic. We had a rainy and gray day yesterday, but today is pure bliss. Temperatures around 60 degrees at the 7:00am start, no wind, sunshine and not a cloud in the sky. The race itself went well. I had a pretty brutal running year, suffering most of the time from some Achilles tendon inflammation. It has gotten better in the last few weeks, possibly thanks to some heel cups I now put in the shoes. But I had exactly one run longer than ten miles since the Boston Marathon. So I lost a lot of speed as well as endurance, and was a little nervous as to how I'd do. And considering all this, it went pretty well. I finished in 1:41:50, or a 7:47 pace. While that is easily the slowest half in a number of years, at least I got to run it evenly, pain free and with a negative split (== faster second half) and some gas left for a fast last half mile or so. So maybe I don't have to retire from running just yet. We'll see if I get some speed back in 2011.

7 September 2010

Biella Coleman: Ireland

This summer of 2010 has been memorable. It started with a difficult period following the hospitalization and death of my mother, continued with a series of very intense and equally memorable conferences that catapulted me out of my funk, and ended with a trip to Ireland, perhaps one of my most pleasant trips ever. I have always wanted to go there, as I have some good Irish friends and was also quite attracted to the place due to its history, so when the opportunity came for me to go, I did not hesitate to book my ticket. I was not left disappointed in any way, shape or form, although since I barely experienced the gray, misty, and rainy weather Ireland is famous for, my experience may admittedly be a bit skewed. These are some of the things I did, some of my fragmented thoughts about Ireland, and some photos, proof that the weather was UnIrish. Ireland and The Irish: Well, I can't speak of The Irish as if they were some unitary group, but I did learn a lot about Irish history and managed to hang out with a number of Irish folks (even a family), and one thing that seems to mark Ireland as distinct, what makes it stand out from the rest of its Western European brothers and sisters, is the pervasive sense of history bleeding into the ambiance, perhaps because it is so tragic. The short version of the history, if you don't know it, is that the Irish, especially the Catholics, got repeatedly screwed by the British monarchs/rulers/planters/government for nearly a thousand years, the last five hundred of those being particularly harsh and ugly, a cycle of slight gains crushed by various forms of tyranny and violence, at least until part of the country achieved independence (Northern Ireland is a bit of a different story). I may have gone a bit out of my way to learn about Irish history, more than I have done for any other place, but this historical consciousness seemed to be inescapable, precipitating into all sorts of conversations and places. 
To take one example, I went to see Gaelic football, one of the two beloved national sports (the other being hurling), and the minute you learn anything about this sport, you learn that it is intimately bound up with the Irish fight for independence and nationalism. The Irish are also very warm, kind, and outgoing. They also seem to curse an awful lot, so much so that cursing is a bit of a national pastime, which yes, I (f*cking) loved, as I tend to have a bit of a foul mouth myself, curbed, I will admit, in recent years and in the classroom. It crept up in a lot of places but was most pronounced during the All-Ireland semi-final Gaelic football game, when the ladies (not lads, mind you) behind me were constantly yelling at the referee, hurling the c-word (rhymes with trunk) whenever he made a call they disagreed with. EASA/Maynooth: I went to Ireland to attend the largest anthropological meetings in Europe, and in particular an all-day panel on digital anthropology, which seemed like a great opportunity given we are a bit of a minority. The conference was impressively large with roughly 1200 attendees (can you believe there are that many anthropologists?), smoothly run, and the all-day panel on digital media was quite lively; I got to meet some really interesting folks. I was a tad sad to find out Maynooth is the only university in Ireland with an Anthropology department (for crying shame, lads!!!) but at least it is located in a darn stunning university: the old quarters of the campus are strikingly beautiful. Anonymous: I have done some work on Anonymous, and when I found out there was going to be a raid/protest at the Church of Scientology (a pretty dismal and run-down church), I got in contact with Irish anons to let them know I was coming. Although someone first decapitated me (at least in character with their norms, right?), when I showed up in person, they were not only civil but really quite hospitable (greeting me with one of my favorite songs). 
Overall it was a great day. I was reminded of important differences among Anons (Irish Anons take their anonymity pretty seriously, the New York Anons do not), and it was also good to experience the social life and metabolism of a protest, especially one attended by folks who have lost family to the church. Dublin: Since I stayed with my friend and his family in Dublin, this is where I spent most of my time. I was able to hook up with various friends, including one from graduate school who had just gotten back from years of fieldwork in Rwanda, and hearing about his experiences and his stunning but stunningly sad project made me feel like mine in comparison was Child's Play (in fact, it really was). I got to see the Debian crew (many of whom work at Google) and I finally paid a visit to the office, which was exactly how I imagined it to be (good and abundant food, good lighting, lots of toys and bikes, lots of Star Wars posters... yep, it could have been in Silicon Valley). But I was surprised at the young age of the marketing and sales folks who were hanging in their lounge when I ran into them. In fact, when I saw them I felt like I was looking at my freshman class or something! It was great to see the Debian folks (though no one I met was actually Irish); that is one of my favorite things to do whenever I visit a foreign city. I walked my heart out in the city, getting a blister in a shoe that I thought was blister-proof, and while Dublin is not as picturesque as some other European cities, it has a ton of character and no shortage of Guinness and pubs (no surprise there). My favorite places/things were: the National Library (great exhibit on Yeats, but make sure to use the multimedia, as that is where all the information is stuffed), St. 
Stephen's Park (overflowing with chubby ducks and lovely flowers), the prison Kilmainham Gaol (would not advise a visit if you are feeling in any way down, there is some heavy shit you learn during the tour), the simple stained glass that seemed pretty common, and finally the Long Room in Trinity College, which you enter after the Book of Kells (I realized just how much I adore books when I visited this old library stuffed from floor to ceiling with old, old, old books). The West Coast: I did not think I was going to head out west, but after Hurricane Earl started its burst along the Eastern Seaboard, I was able to change my ticket for free, so I stayed a few more days. I went to the Burren and the Cliffs of Moher, both totally stunning, really majestic. As is often the case with these types of natural wonders, I am left elated and awed, but such strikingly wondrous places also seem to subsequently spur a more melancholic state of mind and heart. Friends, Family, and Dogs: While in Dublin I stayed with my friend A. and his extremely hospitable family, which included a brother, a father, and three Irish mutts, one of which, Buster (pit bull/lab mix), pretty much stole my heart. Buster's true love is food, so much so that he almost poisoned himself to death a little while back snorting down something he shouldn't have, costing the family a pretty penny to save him. My friend no longer lives there but came from Berlin, and it was a real treat not only to spend days layered upon each other with a friend (it has been an awfully long time since I have done that outside of conferences) but also to meet his family. You learn a lot about your friends that way, and in this case, there is some serious, and I mean serious, intellectual jousting that happens, sometimes bordering on warfare, but generally it plays out in more contained, civil and fascinating fashion. Now I understand why my friend is armed with seemingly endless knowledge: it was needed for purposes of defense at home. 
So in essence, a great, great trip and a fantastic way to end a memorable summer and transition into what I hope will be a bit of a monkish (I call it monk mode) period for this academic year. I am (so so so) fortunate to have a fellowship at the Institute for Advanced Study and am going to try my darndest to take advantage of the fact that I am not teaching (Hell Yes!) and hide away and accomplish all that I have set out to do.

4 July 2010

Torsten Landschoff: Postprocessing conference videos

I was planning to attend DebConf New York this year, but for a number of reasons I decided not to go. Fortunately, Ferdinand Thommes organized a MiniDebConf in Berlin at LinuxTag, and I managed to attend. Thanks, Ferdinand! There were a number of interesting talks. I especially liked the talk by our DPL, and those about piuparts and git-buildpackage. In contrast to the other LinuxTag talks, we had a livestream of our talks and recorded (most of) them. Kudos for setting this up go to Alexander Wirt, who spent quite a few hours getting it up and running. I have to apologize for being late in bringing my notebook, which was intended to do the Theora encoding of the livestream. This was a misunderstanding on my part; I should have known that this was not going to be set up in the night before show time. So to compensate for the extra hours he had to put in for me, I offered to do the post-processing of the videos.
Basic approach for post-processing
The main goal of post-processing the videos was (of course) to compress them to a usable size from the original 143 GB. I also wanted to have a title on each video, and to show the sponsors at the end of the video. My basic idea to implement that consisted of the following steps:
  1. Create a title animation template.
  2. Generate title animations from template for all talks.
  3. Use a video editor to create a playlist of the parts: title, talk, epilogue.
  4. Run the video editor in batch mode to generate the combined video.
  5. Encode the resulting video as ogg theora.
As always with technology, it turned out that the original plan needed a few modifications.
Title animations
<video controls="controls" height="288" src="http://www.landschoff.net/blog/uploads/2010/07/mdc2010_title_anim1.ogv" width="360 "></video>
Originally I wanted to use Blender for the title animation, but I knew it is quite a complicated bit of software. So I looked for something simpler, and stumbled across an article that pointed me towards Synfig Studio for 2D animation. This is also in Debian, so I gave it a try. I was delighted that Synfig Studio has a command-line renderer, which is just called synfig, and that the file format is XML, which would make it simple to batch-create the title animations. My title template can be found in this git repository.
Batch creation of title animations
I used a combination of make and a simple Python script to substitute the author name and the title of the talk into the Synfig XML file. The data for all talks is in another XML file, talks.xml. Basically, I used a simple XPath expression to find the relevant text node and changed the data using the ElementTree API of the lxml Python module. The same could be done using XSLT of course (for a constant replacement, see this file), but I found it easier to combine two XML files in Python. Note that I create PNG files with synfig and use ffmpeg to generate a DV file from those. Originally, I had synfig create DV files directly, but those turned out quite gray for some reason. I am now unable to reproduce this problem.
Combining the title animation with the talk
For joining the title animation with the talk, I originally went with OpenShot, which somebody on the video team had running at the conference. My idea was to mix a single video manually and just replace the underlying data files for each talk. I expected that this would be easy using the openshot-render command, which renders the output video from the input clips and the OpenShot project file. However, OpenShot stores the video lengths in the project file and takes them literally, so this did not work for talks of different play times. I considered working with Kino or Kdenlive, but they did not look more appropriate for this use case. 
I noticed that OpenShot and Kdenlive both use the Media Lovin' Toolkit (MLT) under the hood, and OpenShot actually serializes the MLT configuration to $HOME/.openshot/sequence.xml when rendering. I first tried to read that XML file from Python (using the MLT Python bindings from the python-mlt2 package) but did not find an API function to do that. So I just hard-coded the video sequence in Python. I ran into a few gotchas on the way.
Things to improve
While the results look quite okay to me now, there is a lot of room for improvement.
Availability
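The template-filling step can be sketched as follows. The author used lxml's XPath API; this sketch uses the stdlib xml.etree.ElementTree to show the same idea, and the element and attribute names are made up for illustration (the real Synfig file layout and talks.xml differ).

```python
# Sketch: batch-fill an XML title template with per-talk data.
# Illustrative only -- real Synfig .sif files have a different
# structure, and the original used lxml's richer XPath support.
import xml.etree.ElementTree as ET

def fill_template(template_xml, author, title):
    """Return the template with the (hypothetical) author/title
    text nodes replaced by the given values."""
    root = ET.fromstring(template_xml)
    for node in root.findall('.//text[@role="author"]'):
        node.text = author
    for node in root.findall('.//text[@role="title"]'):
        node.text = title
    return ET.tostring(root, encoding="unicode")

def talks(talks_xml):
    """Yield (author, title) pairs from a talks.xml of the assumed
    shape <talks><talk><author>..</author><title>..</title></talk>...</talks>."""
    for talk in ET.fromstring(talks_xml).findall("./talk"):
        yield talk.findtext("author"), talk.findtext("title")
```

Each (author, title) pair would then drive one run of the synfig command-line renderer, with ffmpeg turning the resulting PNGs into a DV clip, all orchestrated by make.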

13 January 2010

Matt Brubeck: Finding SI unit domain names with Node.js

I'm working on some ideas for finance or news software that deliberately updates infrequently, so it doesn't reward me for checking or reloading it constantly. I came up with the name "microhertz" to describe the idea. (1 microhertz = once every eleven and a half days.) As usual when I think of a project name, I did some DNS searches. Unfortunately "microhertz.com" is not available (but "microhertz.org" is). Then I went off on a tangent and got curious about which other SI units are available as domain names. This was the perfect opportunity to try Node.js, so I could use its asynchronous DNS library to run dozens of lookups in parallel. I grabbed a list of units and prefixes from NIST and wrote the following script:
var dns = require("dns"), sys = require('sys');
var prefixes = ["yotta", "zetta", "exa", "peta", "tera", "giga", "mega",
  "kilo", "hecto", "deka", "deci", "centi", "milli", "micro", "nano",
  "pico", "femto", "atto", "zepto", "yocto"];
var units = ["meter", "gram", "second", "ampere", "kelvin", "mole",
  "candela", "radian", "steradian", "hertz", "newton", "pascal", "joule",
  "watt", "colomb", "volt", "farad", "ohm", "siemens", "weber", "henry",
  "lumen", "lux", "becquerel", "gray", "sievert", "katal"];
for (var i = 0; i < prefixes.length; i++) {
  for (var j = 0; j < units.length; j++) {
    checkAvailable(prefixes[i] + units[j] + ".com", sys.puts);
  }
}

function checkAvailable(name, callback) {
  var resolution = dns.resolve4(name);
  resolution.addErrback(function (e) {
    if (e.errno == dns.NXDOMAIN) callback(name);
  });
}
Out of 540 possible .com names, I found 376 that are available (and 10 more that produced temporary DNS errors, which I haven't investigated). Here are a few interesting ones, with some commentary: To get the complete list, just copy the script above to a file, and run it like this: node listnames.js Along the way I discovered that the API documentation for Node's dns module was out-of-date. This is fixed in my GitHub fork, and I've sent a pull request to the author Ryan Dahl.

22 September 2009

Steve Kemp: Hack the planet!

Recently I was viewing Planet Debian and there was an entry present which was horribly mangled, although the original post seemed to be fine. It seemed obvious to me that some of the filtering which the planet software had applied to the original entry had caused it to become broken, malformed, or otherwise corrupted. That made me wonder what attacks could be performed against the planet aggregator software used on Planet Debian.
Originally Planet Debian was produced using the planet software. This was later replaced with the actively developed planet-venus software instead. (The planet package has now been removed from Debian unstable.)
Planet, and the Venus project which forked from it, do a great job at scrutinising their input and removing malicious content. So my only hope was to stumble across something they had missed. Eventually I discovered the (different) filtering applied by the two feed aggregators missed the same malicious input - an image with a src parameter including javascript like this:
<img src="javascript:alert(1)">
When that markup is viewed by some browsers it will result in the execution of javascript. In short it is a valid XSS attack which the aggregating software didn't remove, protect against, or filter correctly.
In fairness it seems most of the browsers I tested didn't actually alert when viewing that code - but as a notable exception Opera does. I placed a demo online to test different browsers: If your browser executes the code there, and it isn't Opera, then please do let me know!
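The missing check amounts to validating the URL scheme of each src attribute. A minimal sketch of such a filter in Python (an illustration, not the actual Planet/Venus sanitizer, which does far more) could look like this:

```python
# Sketch: flag javascript: (and other non-http) URL schemes in
# <img src> attributes. Illustrative only.
from html.parser import HTMLParser

SAFE_SCHEMES = ("http:", "https:")

def src_is_safe(url):
    url = url.strip().lower()
    # A relative URL has no scheme before the first "/" or "?",
    # so treat it as safe in this sketch.
    if ":" not in url.split("/")[0].split("?")[0]:
        return True
    return url.startswith(SAFE_SCHEMES)

class ImgAudit(HTMLParser):
    """Collect unsafe <img src> values from a snippet of HTML."""
    def __init__(self):
        super().__init__()
        self.unsafe = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            for name, value in attrs:
                if name == "src" and value and not src_is_safe(value):
                    self.unsafe.append(value)

def audit(html):
    parser = ImgAudit()
    parser.feed(html)
    return parser.unsafe
```

A real sanitizer would also normalize entities and whitespace inside the attribute first, since some browsers historically tolerated obfuscated variants of the javascript: scheme.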
The XSS testing of planets
Rather than produce a lot of malicious input feeds, I constructed and verified my attack entirely offline. How? Well, the planet distribution includes a small test suite, which saved me a great deal of time and later allowed me to verify my fix. Test suites are good things. The testing framework allows you to run tiny snippets of code such as this:
# ensure onblur is removed:
HTML( "<img src=\"foo.png\" onblur=\"alert(1);\" />",
      "<img src=\"foo.png\" />" );
Here we give two parameters to the HTML function: one is the input string, and the other is the expected output string; if the sanitization doesn't produce the string given as the expected result, an error is raised. (The test above is clearly designed to ensure that the onblur attribute and its value are removed.) This is how I initially verified that the SRC attribute wasn't checked for malicious content and removed as I expected it to be. Later I verified this by editing my blog's RSS feed to include a malicious, but harmless, extra section. This was then shown on the Planet Debian output site for about 12 hours. During the twelve-hour window in which the exploit was "live" I received numerous hits. Here's a couple of log entries (IP + referer + user-agent):
xx.xx.106.146 "http://planet.debian.org/" "Opera/9.80
xx.xx.74.192  "http://planet.debian.org/" "Opera/9.80
xx.xx.82.143  "http://planet.debian.org/" "Opera/9.80
xx.xx.64.150  "http://planet.debian.org/" "Opera/9.80
xx.xx.20.18   "http://planet.debian.net/" "Opera/9.63
xx.xx.42.61   "-"                         "gnome-vfs/2.16.3
..
The Opera hits were to be expected from my previous browser testing, but I'm still not sure why some hits came from User-Agents identifying themselves as gnome-vfs/n.n.n. Enlightenment would be rewarding. In conclusion, the incomplete escaping of input by Planet/Venus was allocated the identifier CVE-2009-2937, and will be fixed by a point release. There are a lot of planets out there - even I have one: Pluto - so we'll hope Opera is a rare exception. (Pluto isn't a planet? I guess that's why I call my planet a special planet ;) ObFilm: Hackers.

18 May 2009

Gunnar Wolf: Summertime, summertime!

Summertime, summertime!
Finally, temperatures are back to sanity, and the air is no longer a mass of dust. For us Mexicans (well, for us central Mexicans), spring is the hot, dry season. In spring, we usually reach up to 30°C... Not too much for many people, but hellish for me. Summer is much more palatable, having decently sunny days, cloudy evenings and a nice shower as it gets dark. Even if it is a bit harder for a cyclist to take the streets, I think this is my favorite season. This year, we had several (and early!) false starts. I have come to hate April/May's heat waves (which means I am usually grumpy for my birthday). But finally, we have had almost a week of cloudy skies, with almost daily rains. That makes me happy. But I am even happier when I come to work at my office, look out the window, and am no longer greeted with a 5 km visibility range, but with 20 km. Mind you, this photo was taken past 5 PM, when the sky turns nice and gray (just as I've proclaimed my life to be). According to the Weather Channel's information on Mexico City, we currently have 16.1 km, which makes sense, as I can no longer see the Chiquihuite mountain marking the northern part of the city, but I can perfectly see some buildings near Polanco. The view earlier in the day was... beautiful, partly perhaps because of the contrast with just last week's. Oh, and yes - once again, thank you very much, my dear University. Instead of the closed cubicles many people spend their hours in, this relaxing view was taken from my office's window. I love it!

28 March 2009

Patrick Schoenfeld: Arcor EasyBox A801

Recently, my girlfriend and I decided to get Arcor DigitalTV. So we got some hardware from Arcor (and unfortunately we needed to "upgrade" our real ISDN connection to a SIP-based one):

That second piece of hardware is kind of interesting, because it is an ADSL2+ modem, (WLAN) router, SIP gateway, file/print server (it has one USB port, which can be used to connect a USB hub and up to 4 disks or 1 printer) and 4-port switch in one (I think it doesn't include what's called a 'splitter', because with this kind of connection there is no ISDN signal to be split from the internet signal).

Unfortunately not all is well about this (or about my provider). This piece of hardware can be configured with a modem installation code, which is good for an average user, because he does not need to care about anything. With this process the hardware configures itself by receiving configuration data from a configuration server of the provider. Regardless of other implications this might have, what disturbed me about this is that I lose control of my router if I use it. When configured this way, some pages in the router are grayed out; if you click them, you get a message that this setting is controlled by your provider. Uargh. Well, unfortunately IPTV from Arcor does not simply use the same internet connection as I usually do (more about this later), so I was stuck without information on how to configure the router manually.
Arcor wasn't particularly helpful in this, because they told us "Der Router kann nicht manuell konfiguriert werden." (which means in English: "The router cannot be configured manually."). Although this made me laugh, I was also disappointed and worried. So I decided to find it out myself. I won't tell everything I tried, but in the end I found out that the modem-installation-code setup does configure:

With this knowledge (and some good guessing, e.g. that both PPPoE links eventually use the same user data) I was able to configure the router fully myself, which also enables me to set QoS settings as I want them (which is why I went through the whole torture at all, because SSH was extremely laggy while watching TV). One thing is notable, however: how I got the information.

The manufacturer of the router seems to do some things to protect the settings from the user. For example: the page for configuring the WAN is called wan_main.stm. It can be called without trouble if you are configuring the router manually, but the system blocks access when you use the modem installation code. BUT (and that's how I got the info that the third link is a MAC Encapsulation link): the status page (where it shows that you are connected etc.) includes JavaScript vars for nearly everything you ever wanted to know about your router configuration. Not obfuscated at all. You just have to look at the source of the status frame. That's real professional, Arcor.
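The status-frame trick above is easy to script: since the page exposes the settings as plain JavaScript variable assignments, a small filter can pull them all out at once. This is a minimal sketch assuming the "var name = 'value'" format described above; the router address and status page name in the usage line are hypothetical, so adjust them for your EasyBox.

```shell
# Pull "var name = 'value'" assignments out of a router status page read
# from stdin. The assignment format is the one described above; no attempt
# is made to handle unquoted or multi-line values.
parse_js_vars() {
  grep -o "var [A-Za-z_][A-Za-z0-9_]* *= *'[^']*'" | sed 's/^var //'
}

# Usage (hypothetical address and page name):
#   curl -s http://192.168.2.1/status_main.stm | parse_js_vars
```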

9 March 2009

John Goerzen: A Few Days With the Kindle 2

So I am going to do something that nobody on the Internet is doing lately: post a review of the Kindle 2 after having only used it for three days. Shocking, yes, I know. I had never even seen a Kindle of either model before getting the Kindle 2. I had, though, thought about getting an eInk device for some time. The $359 Kindle 2 price tag caused me significant pause, though in the end I went for it due to the 30-day return policy. On the surface, I thought that it would be weird to have a Kindle. After all, how often am I away from the computer? And there's a small local library a few blocks from where I work. But I had a hunch it might turn out for me like my iPod did: something that didn't sound all that useful from reading about it, but turned out to be tremendously so after having actually used it. Turtleback Delivery I ordered my Kindle 2 with standard shipping, which meant that it went by FedEx SmartPost. Here is my SmartPost rant. There are two words in "Smart Post" that are misleading. I once had an item take literally a week to make it from St. Louis to Kansas. That is, I kid you not, slower than the Pony Express ran in 1860. This time, my Kindle made it from Kentucky to Kansas in a mere five days. Oh, and it spent more than 24 hours just sitting in St. Louis. The Device Overall, the device is physically quite nice. It is larger and thinner than I had imagined, and the screen also is a bit smaller. It is usually easier to hold than a paperback, due to not having to prevent it from closing too far at the binding edge. The buttons are easy to press, though I could occasionally wish for them to be easier, but that's a minor nit. The Screen The most important consideration for me was the screen. The eInk display is both stunningly awesome and disappointing. This is not the kind of display you get on your computer. Or, for that matter, any other device. It isn't backlit. It reacts to light as paper does. It can be viewed from any angle.
And it consumes no power to sustain an image; only to change it. Consequently, it puts up a beautiful portrait of a famous author on the screen when it is put to sleep, and consumes no power to maintain it. The screen's response time isn't anywhere near as good as you'd expect from a regular LCD. It flashes to black when you turn a page, and there is no scrolling. On the other hand, this is not really a problem. I found the page-turning speed to be more than adequate, and probably faster than I'd turn the page on a real book. The resolution of the display has the feeling of being incredible. The whole thing provides a far different, and in my eyes superior, experience to reading on an LCD or CRT screen. My nit is the level of contrast. The background is not really a pure white, but more of a light gray. This results in a contrast level that is quite clearly poorer than that of the printed page. At first I thought this would be a serious problem, though I am growing somewhat more used to it as I read more. Reading Experience Overall, I've got to say that it is a great device. You can easily get lost in a book reading it on the Kindle. I'm reading David Copperfield for the first time, and have beat a rather rapid path through the first five chapters on the Kindle already. And that, I think, is the best thing that could be said about an ebook reader: it stays out of the way and lets you immerse yourself in your reading. The Kindle's smartly-integrated Oxford American Dictionary was useful too. One thing about a novel written 150 years ago is that there are some words I just haven't ever heard. Nosegay, for instance. You can move a cursor to a word to see a brief pop-up definition appear, or press Enter to see the entire entry. This is nice and so easy that I'm looking up words I wouldn't have bothered to if I were reading the book any other way. A nosegay, by the way, is a bouquet of showy flowers.
Buying Experience The Kindle has a wireless modem tied to the Sprint network. The data charges for this, whatever they may be, are absorbed by Amazon in the cost of the device and/or the books you buy for it. This turned out to be a very smart piece of engineering. I discovered on Amazon's Kindle Daily Post that Random House is offering five mostly highly-rated sci-fi books for free on the Kindle for a limited time. So I went over to the page for each, and made my "purchase". It was only a click or two, and I saw a note saying it was being delivered. A few minutes later, I picked up the Kindle off the kitchen counter. Sure enough, my purchases were there ready to read. Impressive. This level of ease of use smells an awful lot like Apple. Actually, I think it's surpassed them. You can delete books from the Kindle and re-download them at any time. You can initiate that operation from either the PC or the Kindle. And you can also browse Amazon's Kindle store directly from the device itself. I haven't subscribed to any magazines or newspapers, but I gather that they deliver each new issue automatically the moment it's released by the publisher, in the middle of the night. I pre-ordered the (free to Kindle) Cook's Illustrated How-to-Cook Library. It makes me way happier than it should to see "This item will be auto-delivered to your Kindle on March 26" in the order status. Free Books Amazon's Kindle library has a number of completely free Kindle books as well. These are mostly out-of-copyright books, probably sourced from public etext places like Project Gutenberg, and converted with a minimum of human intervention to the Mobipocket format that is the Kindle's native format. As they are free, you can see them in Amazon's library if you sort by price. And, of course, Amazon will transfer them to the Kindle wirelessly, and maintain a copy of them in your amazon.com account. Unfortunately, as with free etexts in general on the Internet, the quality of these varies.
I was very annoyed to find that many free etexts look like they were done on a typewriter, rather than professionally printed. They don't use smart quotes; only the straight ones. When reading on a device that normally shows you a faithful print experience, this is jarring. And I spent an inordinate amount of time trying to find a copy of The Return of Sherlock Holmes that actually had the graphic figures in "The Dancing Men". Ah well. Your Own Content Amazon operates a mail server, username@kindle.com. You can email stuff to it, and it will convert it to the Kindle format and wirelessly upload it to your Kindle for a fee of $0.10. Alternatively, you can use username@free.kindle.com, which does the same thing at no charge, but emails you back a link to download the converted work to install via USB yourself. I tried it with a number of PDFs. It rejected, about a dozen times from my single mail message, a PDF containing only graphic images. However, it does quite well with most text-heavy PDFs, notably doing an excellent job with The Return of Sherlock Holmes from bookstacks.org, the only source I found that was both beautifully typeset and preserved the original figures. Unfortunately, the PDF converter occasionally has trouble identifying what should be a paragraph, particularly in sections of novels dealing with brief dialog. I have also sent it some HTML files to convert, which it also does a great job with. You can also create Mobipocket files yourself and upload them directly. There is a Mobipocket creator, or you can use mobiperl if you are Windows-impaired or prefer something scriptable on Linux. The device presents itself as a USB mass storage device, so you can see it under any OS. There's a documents folder to put your files in. You can back it up with your regular backup tools, too. And it charges over USB. Web Browser I haven't tried it much. It usually works, but seems to be completely down on occasion.
It would get by in a pinch, but is not suitable for any serious work. The guys over at XKCD seem to love it; in fact, their blog post was what finally convinced me to try the Kindle in the first place. Final Thoughts I've ordered a clip-on light and a leather case for the Kindle. The light, I believe, will completely resolve my contrast complaint. The leather case is to protect it, of course. I can't really see myself returning the Kindle anymore. It's way too much fun, and it's making it easier to read more again. And really, if Amazon manages to reach out to a whole generation of people, make it easy and fun for them to read again, and make a profit doing it, they may move up a notch or two from being an evil patent-troll company to a positive social force. Wow, never thought I'd say that one.

25 January 2009

John Goerzen: Review: The Economist

A few months ago, I asked for suggestions for magazines to subscribe to. I got a lot of helpful suggestions, and subscribed to three: The New Yorker, The Atlantic, and The Economist. Today, I'm reviewing the only one of the three that I'm disappointed in, and it's The Economist. This comes as something of a surprise, because so many people (with the exception of Bryan O'Sullivan) recommended it. Let's start with a quote from the issue that found its way to my mailbox this week:
A crowd of 2m or more is making its way to Washington, DC, to witness the inauguration of Mr Obama. Billions more will watch it on television. [link]
Every issue, I see this sort of thing all over: an estimate, or an opinion, presented as unquestioned fact, sometimes pretty clearly wrong or misleading. For weeks before Jan. 20, and even the day before, the widely-reported word from officials was that they had no idea what to expect, but if they had to guess, they'd say that attendance would be between 1-2 million. In the end, the best estimates have placed attendance at 1.8 million. Would it have killed them to state that most estimates were more conservative, and to cite the source of their particular estimate? That's all I want, really, when they do things like this. I knew going into it that the magazine (to American eyes) essentially editorializes throughout, and I don't have a problem with that. But it engages in over-generalization far too often, and that's just when I catch it. This was just a quick example from the first article I read in this issue; it's more blatant in other places, but quite honestly I'm too lazy to go look up more examples at this hour. I do remember, though, them referring to members of Obama's cabinet as if their appointments were certain, back before Obama had even announced his picks, let alone their confirmation hearings happening. One of my first issues of The Economist had a lengthy section on the global automobile market. I learned a lot about how western companies broke into markets in Asia and South America. Or at least I think I did. I don't know enough about that subject to catch them if they are over-generalizing again. The end result is that I read each issue with a mix of fascination and distrust; the topics are interesting, but I can never really tell if I'm being given an accurate story. It often feels like the inside scoop, but when I do have some bit of knowledge of what the scoop is, it's often a much murkier shade of gray than The Economist's ever-confident prose lets on. Don't get me wrong; there are things about The Economist I like.
But not as much as with The New Yorker or The Atlantic, so I'll let my subscription lapse after 6 months but keep reading it until then.

13 December 2008

Bernhard R. Link: Ever wondered about java windows staying empty in some WMs?

It's a longstanding bug that java programs show empty gray windows when being used in many window managers. As there is OpenJDK now, I thought: It's free software now, so look at it and perhaps there is a way to fix it. As always, looking at java related stuff is a big mistake, but the code in question speaks volumes. The window configure code has:
        if (!isReparented() && isVisible() && runningWM != XWM.NO_WM
                && !XWM.isNonReparentingWM()
                && getDecorations() != winAttr.AWT_DECOR_NONE) {
            insLog.fine("- visible but not reparented, skipping");
            return;
        }
and if you wonder how it detects if there is a non-reparenting window manager, it does it by:
    static boolean isNonReparentingWM() {
        return (XWM.getWMID() == XWM.COMPIZ_WM || XWM.getWMID() == XWM.LG3D_WM);
    }
Yes, it really has a built-in list of 12 window managers it tests for. And this is not the only place where it has special cases for some of them; it does so all the time, in different places. But what Sun did not think about: there are more than 12 window managers out there. With this buggy code, it would need a list of every single non-reparenting one (like ratpoison and, if I read the bug reports correctly, also awesome, wmii and a whole list of quite popular ones, too). Or it means that you are not supposed to run graphical java applications unless you use openlook, mwm (motif), dtwm (cde), enlightenment, kwm (kde), sawfish, icewm, metacity, compiz or lookinglass, or no window manager at all. As I had not realized, before reading some debian-release mail, that the old workaround of AWT_TOOLKIT=MToolkit no longer works in lenny (which means I haven't used any graphical java program for a long time), it seems I have decided for the latter. P.S.: I've sent a patch so that one can at least manually tell java that one would like to see windows' contents, as b.d.o/508650
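The flaw is easier to see when the detection is sketched as a lookup against a fixed list. This is a hypothetical shell rendition of the logic quoted above, not the actual OpenJDK code; only the two window managers from the Java snippet count as "known".

```shell
# Hypothetical rendition of the detection logic above: a hard-coded list of
# known non-reparenting window managers misclassifies every WM not on it.
is_non_reparenting_wm() {
  case "$1" in
    compiz|lg3d) return 0 ;;   # the two the code above checks for
    *) return 1 ;;             # ratpoison, awesome, wmii... all wrongly "no"
  esac
}
```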

6 November 2008

Pablo Lorenzzoni: Bash prompts: the essential

Bash is probably the most common command-line shell in the GNU/Linux world. Although a lot of people use alternate shells (such as Zsh), Bash is still shipped with most mainstream distros as the default. Once you have a lot of different remote machines, all running Bash as the shell, it becomes increasingly difficult to pay attention to the prompt, and typing reboot on a machine different from the one you wanted becomes more likely. I deal with that problem by changing Bash prompts. First of all, the basics: Bash prompts are just environment variables with special characters you can set and export. Bash has four of these variables, PS1 to PS4, but usually only the first two matter (actually, just PS1; for a reference on the others, check the manpage). The most common PS1 string is:

spectra@home:~$ echo $PS1
\u@\h:\w\$
spectra@home:~$
This has 4 special characters, escaped with a backslash: \u gives us the username; \h, the hostname; \w, the working directory; and \$, the $ at the end of the prompt (more on this later). So, essentially, one can change that string to anything else:

spectra@home:~$ PS1="my_shell_prompt\$ " 
my_shell_prompt$
Pretty easy. You can check a complete reference of the special characters in the PROMPTING section of the bash manpage. Also, as part of the prompt string, one can use ANSI colors enclosed as non-printing characters (that is, between \[ and \]). ANSI color sequences always begin with ESC[ and end with an m. (Yes, really arbitrary, but that's the way it is.) ESC can be represented as \e. Notice that there are two numbers separated by a semi-colon: the first is always 0 (zero) in the normal colors, and it actually refers to an ANSI attribute called Select Graphic Rendition. You can use 0 (zero) for normal colors, 1 for bold, 2 for faint, etc. So \e[0;30m refers to BLACK, while \e[1;30m refers to DARK GRAY. Wikipedia has a good article on these escape sequences. Once you're satisfied with something printed in a color, to go back to the default (to reset), you issue the \e[0m escape sequence. So, back to my problem: each different machine gets a different color for the hostname. On the hospital machine, for instance, my PS1 looks like:

spectra@hospital:~$ PS1="\[\e[1;33m\]\u\[\e[0m\]@\[\e[0;35m\]\h\[\e[0m\]:\[\e[0;32m\]\w\[\e[0m\]\$ " 
spectra@hospital:~$
With \e[0;35m (Purple) for the hostname. On the home machine, it may be \e[0;34m (Blue)... On the server, it may be \e[0;36m (Cyan), and so on. After a while, you get used to the color and end up linking the color to the machine, so that typing reboot on a machine with the wrong color gets harder than before. To make the changes permanent, put the export PS1 line in one of bash's config scripts (.bashrc, .bash_profile, etc). On some systems, /etc/environment holds lots of environment variable definitions. I just scratched the surface; that's just what works for me. The Bash-Prompt-HOWTO has some interesting examples, and I actually have a friend who uses more esoteric stuff, such as fancybash or bashish, but I'll leave this up to you.
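To avoid picking each machine's color by hand, the choice can even be derived from the hostname. This is a hypothetical helper, not from the post: it sums the hostname's bytes and maps the result onto the six non-black ANSI foreground colors (31-36), so a given host always gets the same color.

```shell
# Hypothetical helper: derive a stable ANSI color code (31-36) from the
# hostname, so every machine gets its own prompt color automatically.
host_color() {
  local sum=0 byte
  for byte in $(printf '%s' "$1" | od -An -tu1); do
    sum=$((sum + byte))
  done
  printf '%d' $((31 + sum % 6))
}

# Same idea as the PS1 above, with the hostname color chosen automatically:
PS1="\[\e[1;33m\]\u\[\e[0m\]@\[\e[0;$(host_color "$(hostname)")m\]\h\[\e[0m\]:\[\e[0;32m\]\w\[\e[0m\]\$ "
```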

26 October 2008

Russell Coker: Links October 2008

Here’s a blog post suggesting that anti-depressant drugs such as Prozac may have helped the US mortgage crisis [1]. Apparently such drugs cause poor impulse control, so it wouldn’t be a good idea to attend a house auction while using them. Here’s an interesting idea about lecturing: give 20 minute talks with something else (practical work or group discussion) in between [2]. Michael Lee wants to “capture the power of that strict time limit, the intensity of a well-crafted 20 minutes”. While I’m not sure that a strict time limit is such a great idea, having talks broken up into sections sounds like it has the potential to offer some benefits. A bible from the 4th century has been found and is being digitised [3]. When the digitisation is complete (next year) it will be published on the net so everyone can see how the bible has changed over the years. Interesting interview with Jim Gray (of MS Research) about storage [4]. It was conducted in 2003 so technology has moved on, but the concepts remain. His ideas for sharing two terabytes of data by using a courier to deliver an NFS or CIFS file server are interesting; the same thing could be done today with five terabytes for a lower cost. Techtarget has a white paper sponsored by Intel about the price/performance of data centers in low-density and high-density designs [5]. I don’t think I’ll ever be in a position to design a data center, but the background information in the paper is very useful. Google has an interesting set of pages describing their efforts to save power in their data centers [6]. They claim to have the most efficient server rooms ever built, and describe how it saves them a lot of money. One of the interesting things that they do is to use evaporative cooling as the primary cooling method. They also have a RE<C (Renewable Energy cheaper than Coal) project [7].
Here’s a Youtube video of an interesting presentation by Andy Thomson (a psychiatrist at the University of Virginia) about male-bonded coalitionary violence [8]. He shows the evidence of it in chimpanzees, humans, and evidence for it being in the common ancestry of chimps and humans (5-6 million years ago). He also shows a link to modern suicide bombing. It’s widely regarded that Cyrus is the fastest IMAP server. Linux-Magazin.de published an article last year comparing Cyrus, UW-IMAP, Dovecot, and Courier and the conclusion is that Courier and Dovecot are the winners [9]. I used Google Translation but the results were not particularly good so I think I missed some of the points that they were trying to make.

25 October 2008

John Goerzen: A Tale of Three Monitors

For quite a few years now, I've been using a Dell 2001FP 20" LCD. Aside from the fact that this is the monitor that has apparently placed me on Dell's "permanently harass with paper catalogs" list (BTW, USPS has sent them a nasty enforcement letter over that), and the fact that when it needed warranty repair, Dell didn't know how to do it because it wasn't connected to a Dell computer, it's been a good monitor.

With Terah's recent conversion to Linux, she needed a monitor. We decided she could use the 20" LCD, and I'd get a new monitor. I have a nice 24" HP LP2465 at work, 1920x1200, so I thought I'd hop on newegg and see what 24" monitors I could find.

I found a Samsung T240 over there. Apparently Samsung's premium "touch of color" line. I did some research. It got glowing reviews -- people seemed to think the display was brilliant, and the only problem was that its stand wasn't height-adjustable. It got rave reviews on Newegg. It was $400 after rebate. And it arrived here Thursday.

I opened up the box, and I have to tell you that it looks beautiful. The bezel is this reflective black with a hint of dark red at the bottom. I put it together, hooked it up, and...

Well, let's just say that "Touch of Color" isn't all that accurate. "Touch of Crap" is more like it.

As I write this, my Firefox window covers half the screen. The center part of it is this brilliant color. The lower part has the white faded to a sickening blue, and the bottom right corner has almost turned into a black-and-white image. Meanwhile, the top part of my screen is simply more contrasty, except for the very top, which has half an inch of brighter area than anywhere else.

In short, the viewing angle on here is so bad that sitting directly in front of the screen gives significant color variations. An xterm with a black background maximized to fill the screen has a black background some places and a gray background others. My solid midnightblue background looks more sky blue at some places and more... really dark blue... at others. And of course, moving my head slightly varies the color that I see. Also, on photos, there is very visible horizontal banding in them -- apparently some sort of dithering going on since this is only a 6-bit display. Ugh.

How on earth this monitor got glowing reviews, I have no idea. I learned a bit about LCD panel types. Apparently the T240 is a TN display. The horror. I'll never buy another one of those again.

The HP LP2465 that I have at work, and which is great with solid colors all across it, is an S-PVA. So I'm buying one of those for home. It's more expensive, but -- it's a good display.

I am still trying to figure out exactly who thinks this is a good monitor. I tried games on it. It was responsive, but again, colors were distorted due to viewing angle. And this distortion means it's completely useless for photo editing, where precise colors are important.

I bought it from Newegg, and their policy generally prohibits refunds on LCDs. I called them, explained the situation, and told them that I wanted to buy an HP LP2465 from them to replace it. After being on hold for a few minutes, they agreed to waive their policy -- and even the 15% restocking fee, once I placed the order for the HP monitor. Classy show, Newegg. That's the one bright spot in all this.

11 October 2008

Bernhard R. Link: Iceweasel 3

Trying to get prepared for lenny, the new iceweasel annoys me more and more.

12 September 2008

Adrian von Bidder: 25 pair telco cable color scheme

I had to rewire a DSL concentrator (thunderstorm blew one of the ports...) today. The concentrator has two fan-out cables (input, output) with 25 RJ-11 connectors, color coded, so I had to find the colors of the remaining non-defective ports (there are 12 ports on this concentrator, the other 13 pairs are not connected. Presumably there isn't a 12 pair cable.) Google shows tons of references to the apparently standard 25 pair color code (first pair is white/blue), but unfortunately “my” cable had a light blue/light yellow first pair with most other pairs made up of cables with a base color and a colored stripe. I couldn't find the color chart on the Internet. Finally, the strange thought of RTFM entered my mind (and I even found the manual of the concentrator), and found that the importer has added a color chart leaflet to it; the cable is referred to as a “Telco50” cable. So here we go:
Pair | First wire           | Second wire
   1 | red                  | red — white
   2 | yellow               | yellow — black
   3 | green                | green — white
   4 | blue                 | blue — white
   5 | brown                | brown — white
   6 | black                | black — white
   7 | purple               | purple — white
   8 | orange               | orange — white
   9 | light green          | green — black
  10 | blue — black         | purple — black
  11 | light blue — black   | light blue — red
  12 | light green — green  | light green — blue
  13 | light green — black  | light green — red
  14 | light blue — blue    | light blue — green
  15 | light yellow — red   | light yellow — black
  16 | light yellow — green | light yellow — blue
  17 | gray                 | gray — black
  18 | gray — green         | gray — red
  19 | red — black          | light red
  20 | light red — blue     | light red — green
  21 | light red — black    | light red — red
  22 | white                | orange — black
  23 | white — blue         | white — green
  24 | white — red          | white — black
  25 | light blue           | light yellow
(Actually, on the DSL concentrator here this order is listed as “pin” numbers while the 12 ports are assigned in reverse order, starting at 25. The concentrator is a Zyxel VES-1012 and is not in production anymore.)

14 July 2008

Evan Prodromou: 26 Messidor CCXVI

Since the launch of Identi.ca a few weeks ago, I've had a very busy time. Not much sleep, but lots of fulfilling and exciting work. It's invigorating to work on something that is popular and that you believe in. And I'm glad that the Franklin Street Statement so succinctly encapsulates those beliefs. As some people may have read in my previous blog post about the motivations behind creating Identi.ca (see Journal/14 Messidor CCXVI), I was part of a group convened at the FSF to discuss the impact of the growth of software as a service on user autonomy. It was a very loose organization of hackers, activists, and scholars who come from different backgrounds but all share an interest in user rights online. As computing moves into "the cloud" (see cloud computing), what power does the user retain to control their own computing experience? As much of our social lives -- romance, family, work, friends -- becomes Web-enhanced, what can we do to assert our right to manage that data and its use? How can software developers and service providers gauge their own proper ethical behaviour, and how can users of services judge what is and is not acceptable to use? We didn't come up with any easy answers, but we've summarized our thinking in a new document: the Franklin Street Statement on Freedom and Network Services. In essence, we've tried to point a direction towards what software developers, service providers, and software users should think hard about when thinking about network services. Our group will continue to explore these issues on our new group blog, http://autonomo.us/ . We're going to concentrate on the effects of software services on user autonomy -- people's ability to make their own informed choices about their data and creative works and the software that processes them. It is a realm that as a society -- a cluster of societies -- we're only beginning to understand, and I think that there is still a lot of exploration to do. 
Autonomo.us includes a wiki where we'll be exploring some of these ideas, and the blog will feature guest submissions about the subject. We're a loose and unofficial group with some smiling benevolence but no sanction from the FSF or any related organization, so we're really going to dig into some of the gray areas of this issue without worrying about making official statements for any one organization. I'm looking forward to the coming months and I hope this issue captures the imagination of the Web's Open Source and Open Content communities. P.S. You can see the FSF press release about the statement and the launch of autonomo.us.

Open Service Definition On a related note, one of the first organizations to approach the issue of Open Services has been the Open Knowledge Foundation. Today they've launched the 1.0 version of the Open Software Service Definition (OSSD). I think the OSSD 1.0 is a great step. It's a bar against which we can start measuring Open Software Services. For example, I think that Identi.ca meets the requirements of the OSSD. Other sites, like Wikipedia, are also clearly compliant. Some services that I really like, such as OpenStreetMap and geonames.org, seem to be compliant. But are they? It's good to start this investigation. I think other kinds of services are on their way there. The announcement by Reddit that their code will now be Open Source is a great step for user autonomy in the social news arena. Now, Reddit needs to consider what an Open Content/Open Data policy would mean for their service... or see others implement it on other sites. I look forward to a rich ecology of open software services growing, now that we have a name for it and a clear community of people interested in the topic.

20 June 2008

Manoj Srivastava: Manoj: The Command Prompt

Or, theming PS1. I am not fanatical about my command prompt. No sirree, not me. It is just that I spend half my life either staring at an Emacs window, or at an xterm command prompt; so even a marginal boost in productivity goes a long way. And I am often logged in over ssh to machines half a continent away, and am still comfortable enough on the Linux VT to spend time there, and I often do not have the GUI gew-gaws feeding me data. Hmm. Data. inpuuut. No. Must focus. With all these open xterms and Emacs terminal-mode frames floating around, it is easy to lose track of where I am on each terminal, and what the working directory is. So I want my command prompt to help me keep track of where I am. If the terminal is an xterm, the title can be set up like
 "user@host:../shortened/path/to/current/working/dir"
I want to know what machine, which id (am I root?), and what directory I am in. If I am deep down in the labyrinths of some work-related directory tree, I want the path to be pruned, from the left, one component at a time. However, this does not help me on the console; so I also want the path to be in the command prompt; but it should not take up too much of the command line, and ideally should just go away as I type a longish command. Gawd, I love zsh. I once did an analysis of my command history. The most often used command sequences were
 cd some-place, ls
So: pwd and ls. I really, really want to see the directory listing when I change into a new directory. As one grows old, memorizing the directory contents of dozens of machines all the time taxes the gray cells a bit. So, I figure, why not let the command prompt handle all that? Having the current working dir always visible cuts down a heck of a lot on the pwd commands, and so all that's left is to insist that the command prompt thingy always run ls after a change of directory. Simple enough. Saves on typing. And time. And this is not just some crazy talk. I want help with noticing whether the previous command exited with an error status (useful for commands that normally do not create output). If I am logged in to a machine on battery power, I want to know that. I also like visual cues about the amount of power remaining (good for my laptop on long flights). I don't want to have to know if the machine uses APM or ACPI, I just want my prompt color to change as the power fades. I want my command prompt to let me know if I am in a directory which is under version control, and if so, what branch I am on. (I occasionally have to come in contact with arch, bzr, git, subversion, svk, and mercurial.) If I am in a version-controlled project, where I am relative to the root of the checked-out tree is often more important than the absolute path, so I want to see relative paths, not absolute paths. I want to know if there are uncommitted files in the working directory. Visually. I want to be reminded if I am in the middle of an ongoing
 rebase -i .
This is not asking for too much, is it? Command Prompt So, here is a screen-shot of this in action: I start from my home directory, go to a directory not under version control, go to a project under git, then a different project with uncommitted files, and then finally to a subversion checkout. All with an angry fruit salad of colors warranted to make Martin Krafft want to claw his eyes out. If you use zsh, then just grab hold of this, and then do:
 autoload -U promptinit
 promptinit
 prompt manoj
This might be a bit of a hit on slow machines, but even my laptop is a core 2 duo, so I do not find it noticeable. Oh, and if you liked this article, you might also like Theming Emacs, and Theming XTerms.
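Of the wishlist above, the run-ls-after-cd part is the easiest piece to replicate on its own: zsh calls a function named chpwd after every directory change. A minimal sketch, covering just that one behaviour and none of the colors or version-control detection of the full prompt theme:

```shell
# zsh runs chpwd automatically after each directory change; printing the
# listing there gives the "always see the new directory's contents" habit.
chpwd() {
  ls
}
```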
